Degenerate Expectation-Maximization Algorithm for Local Dimension Reduction
Authors
Abstract
Dimension reduction techniques based on principal component analysis (PCA) and factor analysis are commonly used in statistical data analysis. The effectiveness of these methods is limited by their global nature. Recent efforts have focused on relaxing this global restriction in order to identify subsets of the data that are concentrated on lower-dimensional subspaces. In this paper, we propose an adaptive local dimension reduction method, called the Degenerate Expectation-Maximization (DEM) algorithm. The method is based on a finite mixture model. We demonstrate that DEM yields significantly better results than local PCA (LPCA) and other related methods on a variety of synthetic and real datasets. The DEM algorithm can be used in applications ranging from clustering to information retrieval.
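As a rough illustration of the mixture-based local dimension reduction idea described above, the following Python sketch clusters data with a finite Gaussian mixture and then runs PCA separately within each component. It is not the paper's DEM algorithm; the helper name `local_dimension_reduction`, the scikit-learn dependency, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of mixture-based local dimension reduction (not the paper's DEM):
# fit a finite Gaussian mixture, then reduce dimension separately within each
# component. All parameter choices below are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import PCA

def local_dimension_reduction(X, n_components=3, local_dim=2):
    """Cluster X with a finite mixture model, then run PCA inside each cluster."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full").fit(X)
    labels = gmm.predict(X)
    local_models = {}
    for k in range(n_components):
        Xk = X[labels == k]
        if len(Xk) > local_dim:               # need enough points for a local subspace
            local_models[k] = PCA(n_components=local_dim).fit(Xk)
    return gmm, labels, local_models

# Example usage on synthetic data concentrated near two planes in R^5
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(200, 5)) @ np.diag([1, 1, .05, .05, .05]),
               rng.normal(loc=5, size=(200, 5)) @ np.diag([.05, 1, 1, .05, .05])])
gmm, labels, local_models = local_dimension_reduction(X, n_components=2)
```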
Similar References
The EM Algorithm for Mixtures of Factor Analyzers
Factor analysis, a statistical method for modeling the covariance structure of high dimensional data using a small number of latent variables, can be extended by allowing different local factor models in different regions of the input space. This results in a model which concurrently performs clustering and dimensionality reduction, and can be thought of as a reduced dimension mixture of Gaussian...
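A minimal numpy sketch of the covariance structure underlying a mixture of factor analyzers, Sigma_k = Lambda_k Lambda_k^T + Psi, may make the "local factor models" idea concrete. The EM updates themselves are not reproduced here, and all dimensions and parameter values below are made up for illustration.

```python
# Minimal sketch of the mixture-of-factor-analyzers covariance structure
# Sigma_k = Lambda_k Lambda_k^T + Psi (low-rank plus diagonal); illustrative only,
# not the EM updates from the paper.
import numpy as np
from scipy.stats import multivariate_normal

d, q, K = 10, 2, 3                       # data dim, latent dim, number of components
rng = np.random.default_rng(0)
pi = np.full(K, 1.0 / K)                 # mixing proportions
mu = rng.normal(size=(K, d))             # component means
Lam = rng.normal(size=(K, d, q))         # factor loadings per component
Psi = np.diag(rng.uniform(0.1, 0.5, d))  # shared diagonal noise covariance

def mfa_density(x):
    """Mixture density p(x) = sum_k pi_k N(x; mu_k, Lam_k Lam_k^T + Psi)."""
    return sum(pi[k] * multivariate_normal.pdf(x, mu[k], Lam[k] @ Lam[k].T + Psi)
               for k in range(K))

print(mfa_density(rng.normal(size=d)))
```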
Set-Oriented Dimension Reduction: Localizing Principal Component Analysis Via Hidden Markov Models
We present a method for simultaneous dimension reduction and metastability analysis of high dimensional time series. The approach is based on the combination of hidden Markov models (HMMs) and principal component analysis. We derive optimal estimators for the log-likelihood functional and employ the Expectation-Maximization algorithm for its numerical optimization. We demonstrate the performance...
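The following sketch only illustrates the combination of an HMM segmentation with per-state PCA; it does not implement the estimators derived in the paper, and the use of the hmmlearn package and the chosen parameters are assumptions made for the example.

```python
# Illustrative sketch only: segment a time series with a Gaussian HMM, then run PCA
# within each hidden state (the paper derives dedicated estimators; this just shows
# the HMM-plus-local-PCA combination). Requires the hmmlearn package.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.decomposition import PCA

def hmm_local_pca(X, n_states=2, local_dim=1):
    hmm = GaussianHMM(n_components=n_states, covariance_type="full", n_iter=100).fit(X)
    states = hmm.predict(X)                       # most likely (Viterbi) state sequence
    return {s: PCA(n_components=local_dim).fit(X[states == s])
            for s in np.unique(states)}
```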
On an Eigenvector-Dependent Nonlinear Eigenvalue Problem
We first provide existence and uniqueness conditions for the solvability of an algebraic eigenvalue problem with eigenvector nonlinearity. We then present a local and global convergence analysis for a self-consistent field (SCF) iteration for solving the problem. The well-known sin Θ theorem in the perturbation theory of Hermitian matrices plays a central role. The near-optimality of the local ...
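A generic SCF iteration for an eigenvector-dependent problem H(V)V = V Lambda can be sketched in a few lines; the particular nonlinear dependence H(V) used below is invented purely for illustration and is unrelated to the paper's analysis.

```python
# Minimal sketch of a self-consistent field (SCF) iteration for an
# eigenvector-dependent eigenvalue problem H(V) V = V Lambda.  The particular
# dependence H(V) below is invented purely for illustration.
import numpy as np

def scf(H_of_V, n, k, tol=1e-10, max_iter=200):
    """Iterate: given V, form H(V) and take its k lowest eigenvectors."""
    V = np.linalg.qr(np.random.default_rng(0).normal(size=(n, k)))[0]
    for _ in range(max_iter):
        w, U = np.linalg.eigh(H_of_V(V))
        V_new = U[:, :k]                      # eigenvectors of the k smallest eigenvalues
        if np.linalg.norm(V_new @ V_new.T - V @ V.T) < tol:  # change of subspace
            return V_new, w[:k]
        V = V_new
    return V, w[:k]

# Toy example: H(V) = A + alpha * diag(row norms of V), a simple nonlinear coupling
A = np.diag(np.arange(1.0, 9.0))
V, lam = scf(lambda V: A + 0.1 * np.diag((V ** 2).sum(axis=1)), n=8, k=2)
```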
Ten Steps of EM Suffice for Mixtures of Two Gaussians
We provide global convergence guarantees for the expectation-maximization (EM) algorithm applied to mixtures of two Gaussians with known covariance matrices. We show that EM converges geometrically to the correct mean vectors, and provide simple, closed-form expressions for the convergence rate. As a simple illustration, we show that in one dimension ten steps of the EM algorithm initialized at...
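In the one-dimensional, known-variance setting described above, the EM updates have a simple closed form; the sketch below runs ten such steps on synthetic data. The data, initialization, and equal mixing weights are illustrative assumptions, not taken from the paper.

```python
# Sketch of EM for a balanced mixture of two 1-D Gaussians with known unit variance:
# only the two means are updated, matching the setting of the convergence analysis.
# Synthetic data and the number of iterations (ten) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
true_mu = np.array([-2.0, 2.0])
x = np.concatenate([rng.normal(true_mu[0], 1, 500), rng.normal(true_mu[1], 1, 500)])

mu = np.array([-0.5, 0.5])                     # crude initialization
for _ in range(10):                            # ten EM steps
    # E-step: responsibilities under equal mixing weights and unit variances
    log_r = -0.5 * (x[:, None] - mu[None, :]) ** 2
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: closed-form weighted-mean update
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

print(mu)   # should be close to the true means after ten steps
```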
Online expectation-maximization type algorithms for parameter estimation in general state space models
In this paper we present new online algorithms to estimate static parameters in nonlinear, non-Gaussian state-space models. These algorithms rely on online Expectation-Maximization (EM) type algorithms. Contrary to standard Sequential Monte Carlo (SMC) methods recently proposed in the literature, these algorithms do not degenerate over time.
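The sketch below illustrates only the basic online EM recursion (stochastic approximation of the E-step sufficient statistics followed by a closed-form M-step) on a toy i.i.d. Gaussian mixture; it is not the SMC-based state-space algorithms proposed in the paper, and the step-size schedule, initialization, and data are illustrative assumptions.

```python
# Minimal illustration of the online EM principle: running averages of the E-step
# sufficient statistics, updated one observation at a time, followed by a closed-form
# M-step.  Toy i.i.d. two-component 1-D Gaussian mixture, not a state-space model.
import numpy as np

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(-2, 1, 2000), rng.normal(2, 1, 2000)])
rng.shuffle(stream)

mu = np.array([-0.5, 0.5])
s0 = np.array([0.5, 0.5])        # running estimate of E[1{z=k}]
s1 = mu * s0                     # running estimate of E[y 1{z=k}]
for t, y in enumerate(stream, start=1):
    gamma = (t + 1) ** -0.6                             # step size, kept strictly below 1
    r = np.exp(-0.5 * (y - mu) ** 2)
    r /= r.sum()                                        # responsibilities (E-step)
    s0 = (1 - gamma) * s0 + gamma * r                   # stochastic approximation
    s1 = (1 - gamma) * s1 + gamma * r * y
    mu = s1 / s0                                        # M-step in closed form

print(mu)   # should approach the true means -2 and 2 (up to label order)
```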